Training Relation Embeddings under Logical Constraints
Authors
Abstract
We present ways of incorporating logical rules into the construction of embedding-based Knowledge Base Completion (KBC) systems. Enforcing logical consistency in the predictions of a KBC system guarantees that the predictions comply with logical rules such as symmetry, implication, and generalized transitivity. Our method encodes logical rules about entities and relations as convex constraints in the embedding space, enforcing the condition that the score of a logically entailed fact is never less than the minimum score of its antecedent facts. Such constraints provide a weak guarantee that the predictions made by our KBC model will match the output of a logical knowledge base for many types of logical inferences. We validate our method via experiments on a knowledge graph derived from WordNet.
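The score constraint in the abstract can be sketched as a hinge penalty that is zero exactly when the entailed fact scores at least as high as the weakest antecedent. This is a minimal illustrative sketch, not the paper's actual formulation; the bilinear scoring function and all names below are assumptions.

```python
import numpy as np

def bilinear_score(head, relation_matrix, tail):
    # Hypothetical bilinear scoring function: head^T R tail.
    return float(head @ relation_matrix @ tail)

def implication_penalty(antecedent_scores, consequent_score):
    # Hinge penalty for a rule "antecedents => consequent":
    # zero when the entailed fact's score is at least the minimum
    # antecedent score, positive otherwise. For fixed antecedent
    # scores this penalty is convex in the consequent score.
    return max(0.0, min(antecedent_scores) - consequent_score)

# Toy example: rule r1(x, y) => r2(x, y) with 2-d entity embeddings.
rng = np.random.default_rng(0)
x, y = rng.normal(size=2), rng.normal(size=2)
R1, R2 = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
penalty = implication_penalty([bilinear_score(x, R1, y)],
                              bilinear_score(x, R2, y))
```

At training time, such a penalty term could be added to the embedding loss so that rule-violating configurations are pushed back toward the feasible region.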
Similar resources
Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks
Our goal is to combine the rich multi-step inference of symbolic logical reasoning together with the generalization capabilities of vector embeddings and neural networks. We are particularly interested in complex reasoning about the entities and relations in knowledge bases. Recently Neelakantan et al. (2015) presented a compelling methodology using recurrent neural networks (RNNs) to compose t...
Learning First-Order Logic Embeddings via Matrix Factorization
Many complex reasoning tasks in Artificial Intelligence (including relation extraction, knowledge base completion, and information integration) can be formulated as inference problems using a probabilistic first-order logic. However, due to the discrete nature of logical facts and predicates, it is challenging to generalize symbolic representations and represent first-order logic formulas in pr...
Injecting Logical Background Knowledge into Embeddings for Relation Extraction
Matrix factorization approaches to relation extraction provide several attractive features: they support distant supervision, handle open schemas, and leverage unlabeled data. Unfortunately, these methods share a shortcoming with all other distantly supervised approaches: they cannot learn to extract target relations without existing data in the knowledge base, and likewise, these models are in...
Improving Implicit Discourse Relation Recognition with Discourse-specific Word Embeddings
We introduce a simple and effective method to learn discourse-specific word embeddings (DSWE) for implicit discourse relation recognition. Specifically, DSWE is learned by performing connective classification on massive explicit discourse data, and capable of capturing discourse relationships between words. On the PDTB data set, using DSWE as features achieves significant improvements over base...
Learning Task-specific Bilexical Embeddings
We present a method that learns bilexical operators over distributional representations of words and leverages supervised data for a linguistic relation. The learning algorithm exploits low-rank bilinear forms and induces low-dimensional embeddings of the lexical space tailored for the target linguistic relation. An advantage of imposing low-rank constraints is that prediction is expressed as th...
Journal:
Volume/Issue:
Pages: -
Publication year: 2017